Efficiency Evaluation of Character-level RNN Training Schedules
Authors
Abstract
We present four training and prediction schedules for the same character-level recurrent neural network and evaluate their efficiency in terms of model effectiveness as a function of training time and of the amount of training data seen. We show that the choice of training and prediction schedule can have a considerable impact on prediction effectiveness for a given training budget.
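To make the object of study concrete: a character-level RNN reads text one character at a time and is scored by the cross-entropy of its next-character predictions. The sketch below is a hedged illustration only, not the authors' implementation; the single tanh cell, the hidden size, and the random weight scale are all assumptions made for the example.

```python
import math
import random

def char_rnn_loss(text, hidden_size=16, seed=0):
    """Mean next-character cross-entropy of a minimal, untrained
    character-level RNN (tanh cell, softmax output) over `text`."""
    rng = random.Random(seed)
    vocab = sorted(set(text))
    V = len(vocab)
    idx = {c: i for i, c in enumerate(vocab)}

    def mat(rows, cols):  # small random weight matrix
        return [[rng.gauss(0, 0.01) for _ in range(cols)] for _ in range(rows)]

    Wxh = mat(hidden_size, V)            # input -> hidden
    Whh = mat(hidden_size, hidden_size)  # hidden -> hidden (recurrence)
    Why = mat(V, hidden_size)            # hidden -> output logits

    h = [0.0] * hidden_size
    total = 0.0
    for cur, nxt in zip(text, text[1:]):
        # h_t = tanh(Wxh @ onehot(cur) + Whh @ h_{t-1});
        # the one-hot input just selects column idx[cur] of Wxh.
        h = [math.tanh(Wxh[i][idx[cur]] +
                       sum(Whh[i][j] * h[j] for j in range(hidden_size)))
             for i in range(hidden_size)]
        logits = [sum(Why[k][j] * h[j] for j in range(hidden_size))
                  for k in range(V)]
        m = max(logits)  # stabilized softmax
        exps = [math.exp(l - m) for l in logits]
        total -= math.log(exps[idx[nxt]] / sum(exps))
    return total / (len(text) - 1)
```

For an untrained model this per-character loss sits near log |V|, the entropy of a uniform guess over the vocabulary; the schedules compared in the paper differ in how quickly training drives this loss down per unit of time or data seen.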
Similar resources
Make Deep Learning Great Again: Character-Level RNN Speech Generation in the Style of Donald Trump
A character-level recurrent neural network (RNN) is a statistical language model capable of producing text that superficially resembles a training corpus. The efficacy of these networks in mimicking Shakespeare, Linux source code, and other forms of text has already been demonstrated. In this paper, we show that character-level RNNs are capable of very believably mimicking the language of Pres...
Online Keyword Spotting with a Character-Level Recurrent Neural Network
In this paper, we propose a context-aware keyword spotting model employing a character-level recurrent neural network (RNN) for spoken term detection in continuous speech. The RNN is end-to-end trained with connectionist temporal classification (CTC) to generate the probabilities of character and word-boundary labels. There is no need for the phonetic transcription, senone modeling, or system di...
A New Hybrid-parameter Recurrent Neural Networks for Online Handwritten Chinese Character Recognition
The recurrent neural network (RNN) is appropriate for dealing with temporal sequences. In this paper, we present a deep RNN with new features and apply it for online handwritten Chinese character recognition. Compared with the existing RNN models, three innovations are involved in the proposed system. First, a new hidden layer function for RNN is proposed for learning temporal information bette...
Generating Text with Recurrent Neural Networks
Recurrent Neural Networks (RNNs) are very powerful sequence models that do not enjoy widespread use because it is extremely difficult to train them properly. Fortunately, recent advances in Hessian-free optimization have been able to overcome the difficulties associated with training RNNs, making it possible to apply them successfully to challenging sequence problems. In this paper we demonstra...
Gated Feedback Recurrent Neural Networks
In this work, we propose a novel recurrent neural network (RNN) architecture. The proposed RNN, gated-feedback RNN (GF-RNN), extends the existing approach of stacking multiple recurrent layers by allowing and controlling signals flowing from upper recurrent layers to lower layers using a global gating unit for each pair of layers. The recurrent signals exchanged between layers are gated adaptiv...
Journal: CoRR
Volume: abs/1605.02486, Issue: -
Pages: -
Year of publication: 2016